What is Temperature Coefficient and how is it applied?

Temperature Coefficient is an additional specification that is added to the general accuracy specification when the measuring instrument operates at an ambient temperature outside its normal operating range.

It is applied in the following manner:

The specification is normally defined as a percentage per degree Celsius: for each degree outside the normal operating temperature range, this percentage is added to the normal accuracy specification. In general, the Temperature Coefficient is much smaller than the normal accuracy specification. The normal operating temperature range is 18 to 28 °C.
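As a rough sketch, this rule can be written in a few lines of Python; the function and parameter names below are illustrative, not taken from any instrument's documentation:

```python
# Minimal sketch of adding a temperature coefficient to a base accuracy
# spec, assuming the common 18-28 C operating window. Names are
# illustrative, not from any particular instrument's data sheet.

def total_spec_ppm(base_ppm, tc_ppm_per_degc, ambient_c,
                   t_low=18.0, t_high=28.0):
    """Return the accuracy spec (ppm) with the temperature term added."""
    if ambient_c > t_high:
        degrees_outside = ambient_c - t_high
    elif ambient_c < t_low:
        degrees_outside = t_low - ambient_c
    else:
        degrees_outside = 0.0  # inside the window: no extra term
    return base_ppm + tc_ppm_per_degc * degrees_outside

print(total_spec_ppm(30.0, 1.0, 33.0))  # 35.0 ppm, as in the example below
```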

Example:

Temperature Coefficient = 0.0001%/°C.

Accuracy = 20 ppm of reading + 20 ppm of range.

The accuracy in measuring a 0.5 V signal on the 1 V range for the 90-day specification is 30 microvolts: 20 ppm of the 0.5 V reading (10 microvolts) plus 20 ppm of the 1 V range (20 microvolts). Expressed relative to the range, this is 30 ppm, or 0.003%. (Refer to the accuracy calculation FAQ for more details.)
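As a quick check, the 30 microvolt figure can be reproduced in Python (the variable names are illustrative):

```python
# Base accuracy: 20 ppm of a 0.5 V reading plus 20 ppm of the 1 V range.

reading_v = 0.5
range_v = 1.0

error_v = 20e-6 * reading_v + 20e-6 * range_v  # 10 uV + 20 uV

print(f"{error_v * 1e6:.0f} microvolts")          # 30 microvolts
print(f"{error_v / range_v * 1e6:.0f} ppm of range")  # 30 ppm (0.003%)
```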

Now consider that the unit is at an ambient temperature of 33 °C. Since this is 5 °C above the normal operating range, the Temperature Coefficient applies on top of the normal specification.

The Temperature Coefficient is stated as 0.0001% for each °C:

0.0001% = 0.000001 = 1 ppm

5 x 1 ppm = 5 ppm (or 0.0005%)

Adding 5 ppm to the normal specification of 30 ppm gives 35 ppm, or 0.0035%, total uncertainty. For the 0.5 volt signal being measured, this results in 35 microvolts of uncertainty (35 ppm of the 1 V range).

A reported reading in the range of 0.499965 V to 0.500035 V will therefore be within tolerance.
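The whole example can be verified numerically. This sketch assumes the 5 ppm temperature term is taken relative to the 1 V range, which is what the 35 microvolt figure above implies:

```python
# Tolerance band for a 0.5 V reading on the 1 V range at 33 C ambient.
# Assumes the temperature term applies to the range (matches the
# 35 microvolt figure in the example).

reading_v = 0.5
range_v = 1.0

base_v = 20e-6 * reading_v + 20e-6 * range_v  # 30 microvolts
tc_v = 5 * 1e-6 * range_v                     # 5 ppm of range = 5 microvolts
total_v = base_v + tc_v                       # 35 microvolts

print(f"{reading_v - total_v:.6f} to {reading_v + total_v:.6f}")
# 0.499965 to 0.500035
```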